Neural Decoding With Optimization of Node Activations

Authors

Abstract

The problem of maximum likelihood decoding with a neural decoder for an error-correcting code is considered. It is shown that the neural decoder can be improved with two novel loss terms on the nodes' activations. The first loss term imposes a sparsity constraint on the nodes' activations, whereas the second loss term tries to mimic the nodes' activations of a teacher decoder which has better performance. The proposed method has the same run-time complexity and model size as the Belief Propagation decoder, while improving the decoding performance by up to $1.1$ dB on BCH codes.
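The abstract describes two auxiliary loss terms on the decoder's node activations: a sparsity penalty and a term that pushes the activations toward those of a stronger teacher decoder. The following is a minimal sketch of how such terms could be combined with a base decoding loss; the function names, loss weights, and tensor shapes are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch (not the paper's code): auxiliary losses of the kind described
# in the abstract, added on top of a neural BP decoder's usual decoding loss.
import torch
import torch.nn.functional as F

def auxiliary_losses(student_acts, teacher_acts,
                     lambda_sparse=1e-3, lambda_teacher=1e-2):
    """student_acts / teacher_acts: lists of per-iteration node-activation tensors."""
    # L1 penalty encourages sparse node activations in the trained (student) decoder.
    sparse_loss = sum(a.abs().mean() for a in student_acts)
    # MSE pulls the student's activations toward the frozen teacher's activations.
    mimic_loss = sum(F.mse_loss(s, t.detach())
                     for s, t in zip(student_acts, teacher_acts))
    return lambda_sparse * sparse_loss + lambda_teacher * mimic_loss

# Typical usage inside a training step (decoding_loss would be, e.g., BCE on the
# decoder's soft output against the transmitted codeword):
#   loss = decoding_loss + auxiliary_losses(student_acts, teacher_acts)
#   loss.backward()
```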


Similar Resources

Path-Normalized Optimization of Recurrent Neural Networks with ReLU Activations

We investigate the parameter-space geometry of recurrent neural networks (RNNs), and develop an adaptation of path-SGD optimization method, attuned to this geometry, that can learn plain RNNs with ReLU activations. On several datasets that require capturing long-term dependency structure, we show that path-SGD can significantly improve trainability of ReLU RNNs compared to RNNs trained with SGD...

Full text

Reconstructing perceived faces from brain activations with deep adversarial neural decoding

Here, we present a novel approach to solve the problem of reconstructing perceived stimuli from brain responses by combining probabilistic inference with deep learning. Our approach first inverts the linear transformation from latent features to brain responses with maximum a posteriori estimation and then inverts the nonlinear transformation from perceived stimuli to latent features with adver...

Full text

Automatic Attribute Discovery with Neural Activations

How can a machine learn to recognize visual attributes emerging out of online community without a definitive supervised dataset? This paper proposes an automatic approach to discover and analyze visual attributes from a noisy collection of image-text data on the Web. Our approach is based on the relationship between attributes and neural activations in the deep network. We characterize the visu...

Full text

Adaptation of Neural Networks Constrained by Prior Statistics of Node Co-Activations

We propose a novel unsupervised model adaptation framework in which a neural network uses prior knowledge of the statistics of its output and hidden layer activations to update its parameters online to improve performance in mismatched environments. This idea is inspired by biological neural networks, which use feedback to dynamically adapt their computation when faced with unexpected inputs. H...

Full text

Decoding as Continuous Optimization in Neural Machine Translation

We propose a novel decoding approach for neural machine translation (NMT) based on continuous optimisation. The resulting optimisation problem is then tackled using constrained gradient optimisation. Our powerful decoding framework enables decoding intractable models such as the intersection of left-to-right and right-to-left (bidirectional) as well as source-to-target and target-to-source (bili...

Full text


Journal

Journal title: IEEE Communications Letters

Year: 2022

ISSN: 1558-2558, 1089-7798, 2373-7891

DOI: https://doi.org/10.1109/lcomm.2022.3197974